To the Editor:
An article by Tets Maniwa in a recent issue of ISD Magazine has me concerned. The article is "Attending to Signal Integrity in Complex Designs" (November 2000 pg. 62). Towards the end, he mentions, "The development of asynchronous clock drivers for some sections of the design will reduce power surges ...". This will be disastrous! It will reduce average power surges, but will create unpredictable ones. At some point, the asynchronous clocks will randomly drift into sync, and the result will be totally uncontrolled and unpredictable.
The probability of complete synchronization is computable. The chip will fail randomly, for no explainable reason, when the clocks happen to come into sync.
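As a rough illustration of that computability, here is a minimal Monte Carlo sketch in Python; the clock periods and the coincidence window are assumed for the example, not figures from the article. With independent, free-running clocks the phases are effectively uniform and independent, so the chance that every clock fires an edge inside the same narrow window is a product of per-clock probabilities: small, but never zero.

    import random

    # Assumed, illustrative numbers -- not from the article.
    PERIODS_NS = [10.0, 10.7, 11.3]   # three mutually asynchronous clocks
    WINDOW_NS  = 0.5                  # edges within +/-0.5 ns count as aligned

    # Analytic: P(a given clock has an edge within the window) = 2*window/period.
    p_align = 1.0
    for p in PERIODS_NS:
        p_align *= min(1.0, 2.0 * WINDOW_NS / p)
    print(f"P(all edges aligned at a random instant) ~ {p_align:.2e}")

    # Monte Carlo check: draw a random phase for each clock and test whether
    # the observation instant falls near a rising edge of all of them.
    hits, trials = 0, 1_000_000
    for _ in range(trials):
        aligned = True
        for p in PERIODS_NS:
            d = random.uniform(0.0, p)        # offset from the last edge
            if min(d, p - d) > WINDOW_NS:     # not near an edge of this clock
                aligned = False
                break
        hits += aligned
    print(f"Monte Carlo estimate: {hits / trials:.2e}")

Under these assumed numbers, both estimates come out around 8e-4 per observation: rare on any one cycle, but at clock rates of hundreds of megahertz such alignments would recur constantly, which is Mr. Hatch's point.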
Please don't design a chip that is guaranteed to fail occasionally for no explainable reason. It is much better to have all clocks synchronous, and design to handle the power surges. That way there will be no surprises. How can we get this warning out to chip designers so they don't make this fatal mistake?
Ken Hatch, ISD Magazine subscriber
Tets Maniwa replies:
The key words are "some sections." Statistically, only about 10 to 15 percent of a circuit is active during any clock cycle. Moving some of those switching edges out of synchronization with the clock yields a significant decrease in power and noise. And even if all of the asynchronous edges happened to realign with the clock, only that same small percentage of the circuitry would be switching at once, which wouldn't cause a major brown-out on the chip.
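For readers who want the arithmetic spelled out, here is a minimal sketch; the gate count and the fraction of activity moved off the main clock are assumed for illustration, not figures from the reply.

    # Assumed, illustrative numbers -- not from the reply.
    total_gates    = 1_000_000
    activity       = 0.15        # ~10-15% of the circuit switches per cycle
    moved_fraction = 0.5         # share of active logic moved onto async edges

    active      = total_gates * activity            # 150,000 gates
    sync_active = active * (1.0 - moved_fraction)   # still on the clock edge

    print(f"Peak simultaneous switching, all synchronous: {active:,.0f} gates")
    print(f"Typical peak after moving some edges off:     {sync_active:,.0f} gates")

    # Worst case: every asynchronous edge happens to line up with the clock.
    # The peak only climbs back to the original 150,000 gates -- about 15% of
    # the chip -- so a chance alignment is bounded, not a chip-wide brown-out.
    print(f"Worst case if all edges realign:              {active:,.0f} gates")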
Tets Maniwa, Community Leader, EEdesign.com
To the Editor:
After reading your column ("Is There a Prima Donna in The House?" November 2000 pg. 6) about the anonymous source you talked with at ITC, I just had to write.
As you know, I've spent many, many years (just remember my grey hair) teaching and preaching DFT (and BIST). The audiences are always mainly test engineers trying to figure out how to get hardware designers, and their managers, to take these considerations seriously early enough in the design process to be effective in reducing test development time and manufacturing test costs. This hasn't changed much.
I just did short courses at Wescon and Northcon. Twelve attendees at Wescon (10 test, 2 design); twenty at Northcon (17 test, 3 design). In the course of these 3-hour sessions, it became clear why we had the design engineers we did: they had indeed been burnt!
So I think your anonymous source is very right about the pitfalls that designers face with increasingly complex SOC designs, especially mixed-signal designs, but I fear that his or her optimism could be misplaced.
I predicted, in print, in 1980 that the 1980s would be the "decade of testability." In 1990, I had to admit that I had missed by a decade. This year, I am confidently predicting that the 21st century will be the century of DFT.
On the issue of EDA tools, I agree that the developers are trying their absolute best. The same goes for the developers of BIST technologies, an area that promises to really lower tester and test-development costs if BIST is incorporated during the design phase of an SOC project. On the other hand, consider that expenditures for test-related ATE tools run roughly $100 million per year, against total EDA tool licensing fees in the billions of dollars. Top management among EDA tool users really needs to pay attention to where the money is being spent if EDA and BIST IP vendors are going to be able to invest in enough development to help your anonymous source's design prima donnas.
As the founder and past president of the ASTE, and a co-founder and past chair of the IEEE 1149 committee, I urge you to keep the DFT and BIST issues at the forefront in ISD Magazine, and to urge your test engineering readers to continue to hope, because DFT and BIST are going to happen.
Jon Turino, Product Marketing Manager, Fluence Technology, Inc.
To voice an opinion on this or any other article in Integrated System Design, please e-mail your comments to sdean@cmp.com.